

Search for: All records; Creators/Authors contains: "Mark, Brian L."


  1. Stochastic network calculus involves the use of a traffic bound or envelope to make admission control and resource allocation decisions for providing end-to-end quality-of-service guarantees. To apply network calculus in practice, the traffic envelope should: (i) be readily determined for an arbitrary traffic source, (ii) be enforceable by traffic regulation, and (iii) yield statistical multiplexing gain. Existing traffic envelopes typically satisfy at most two of these properties. A well-known traffic envelope based on the moment generating function (MGF) of the arrival process satisfies only the third property. We propose a new traffic envelope based on the MGF of the workload process obtained from offering the traffic to a constant service rate queue. We show that this traffic workload envelope can achieve all three properties and leads to a framework for a network service that provides stochastic delay guarantees. We demonstrate the performance of the traffic workload envelope with two bursty traffic models: Markov on-off fluid and Markov modulated Poisson Process (MMPP). 
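    A minimal numerical sketch of the workload-envelope idea (with assumed source and queue parameters, not the construction evaluated in the paper): a synthetic Markov on-off trace is offered to a constant-rate queue via the Lindley recursion, and the MGF of the resulting workload is estimated at a few values of theta.

        import numpy as np

        rng = np.random.default_rng(0)

        def markov_on_off_arrivals(n, p_on=0.2, p_off=0.4, peak_rate=5.0):
            """Per-slot arrivals from a two-state (on/off) Markov source (assumed model)."""
            arrivals, state = np.empty(n), 0            # start in the off state
            for k in range(n):
                if state == 0:
                    state = 1 if rng.random() < p_on else 0
                else:
                    state = 0 if rng.random() < p_off else 1
                arrivals[k] = peak_rate if state == 1 else 0.0
            return arrivals

        def workload_mgf(arrivals, service_rate, thetas):
            """Lindley recursion for the workload of a constant-rate queue,
            followed by an empirical estimate of E[exp(theta * W)]."""
            w, samples = 0.0, []
            for a in arrivals:
                w = max(w + a - service_rate, 0.0)
                samples.append(w)
            samples = np.array(samples)
            return np.array([np.exp(theta * samples).mean() for theta in thetas])

        trace = markov_on_off_arrivals(200_000)
        print(workload_mgf(trace, service_rate=2.5, thetas=np.array([0.01, 0.05, 0.1, 0.2])))

    The resulting log-MGF values, taken over a grid of theta, are one way to tabulate a workload envelope for a given trace.
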
  2. Network tomography aims at estimating source–destination traffic rates from link traffic measurements. This inverse problem was formulated by Vardi in 1996 for Poisson traffic over networks operating under deterministic as well as random routing regimes. In this article, we expand Vardi's second-order moment matching rate estimation approach to higher-order cumulant matching with the goal of increasing the column rank of the mapping and consequently improving the rate estimation accuracy. We develop a systematic set of linear cumulant matching equations and express them compactly in terms of the Khatri–Rao product. Both least squares estimation and iterative minimum I-divergence estimation are considered. We develop an upper bound on the mean squared error (MSE) in least squares rate estimation from empirical cumulants. We demonstrate that supplementing Vardi's approach with the third-order empirical cumulant reduces its minimum averaged normalized MSE in rate estimation by almost 20% when iterative minimum I-divergence estimation is used.
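    A toy end-to-end sketch of cumulant matching via the Khatri–Rao product (the routing matrix, rates, sample size, and NNLS solver below are illustrative assumptions, not the article's experiments): for independent Poisson origin-destination counts X with rates r and link counts Y = A X, the mean, covariance, and third-order cumulants of Y are all linear in r, with coefficient matrices A, A⊙A, and A⊙A⊙A, where ⊙ denotes the column-wise Khatri–Rao product.

        import numpy as np
        from scipy.optimize import nnls

        rng = np.random.default_rng(1)

        # Toy routing matrix: 3 links x 4 origin-destination pairs (assumed).
        A = np.array([[1, 0, 1, 0],
                      [0, 1, 1, 0],
                      [1, 1, 0, 1]], dtype=float)
        true_rates = np.array([2.0, 1.0, 3.0, 0.5])

        def khatri_rao(B, C):
            """Column-wise Kronecker (Khatri-Rao) product of B and C."""
            return np.vstack([np.kron(B[:, j], C[:, j]) for j in range(B.shape[1])]).T

        # Simulate independent Poisson OD counts and the induced link counts.
        n_samples = 50_000
        X = rng.poisson(true_rates, size=(n_samples, A.shape[1]))
        Y = X @ A.T

        # Empirical cumulants of the link counts.
        mean_Y = Y.mean(axis=0)
        cov_Y = np.cov(Y, rowvar=False, bias=True)
        centered = Y - mean_Y
        cum3_Y = np.einsum('ni,nj,nk->ijk', centered, centered, centered) / n_samples

        # For independent Poisson X: E[Y] = A r, vec(Cov Y) = (A kr A) r,
        # vec(cum3 Y) = (A kr A kr A) r, with kr the Khatri-Rao product.
        B = np.vstack([A, khatri_rao(A, A), khatri_rao(khatri_rao(A, A), A)])
        b = np.concatenate([mean_Y, cov_Y.ravel(), cum3_Y.ravel()])

        est_rates, _ = nnls(B, b)
        print("true:", true_rates, "estimated:", np.round(est_rates, 2))

    Appending the third-order rows is what raises the column rank of the mapping, which is the effect quantified above.
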
  3. Providing end-to-end network delay guarantees in packet-switched networks such as the Internet is highly desirable for mission-critical and delay-sensitive data transmission, yet it remains a challenging open problem. Since deterministic bounds are based on the worst-case traffic behavior, various frameworks for stochastic network calculus have been proposed to provide less conservative, probabilistic bounds on network delay, at least in theory. However, little attention has been devoted to the problem of regulating traffic according to stochastic burstiness bounds, which is necessary in order to guarantee the delay bounds in practice. We design and analyze a stochastic traffic regulator that can be used in conjunction with results from stochastic network calculus to provide probabilistic guarantees on end-to-end network delay. Two alternative implementations of the stochastic regulator are developed and compared. Numerical results are provided to demonstrate the performance of the proposed stochastic traffic regulator. 
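    For orientation only, the following sketch implements a plain deterministic (sigma, rho) shaper that delays packets just enough to keep cumulative departures below sigma + rho*t; the stochastic regulator proposed above instead enforces a burstiness bound that may be exceeded with a controlled probability, which this sketch does not capture. The sigma and rho values are arbitrary.

        def regulate(arrivals, sigma, rho):
            """arrivals: list of (arrival_time, size_bytes) pairs, time-ordered.
            Returns release times keeping cumulative departures <= sigma + rho*t."""
            released, last_release, releases = 0.0, 0.0, []
            for t, size in arrivals:
                earliest = (released + size - sigma) / rho   # earliest conforming time
                release = max(t, earliest, last_release)
                releases.append(release)
                released += size
                last_release = release
            return releases

        # Example: ten 1500-byte packets arriving back to back at t = 0,
        # shaped to sigma = 4000 bytes and rho = 1e6 bytes/s (assumed values).
        print(regulate([(0.0, 1500.0)] * 10, sigma=4000.0, rho=1e6))
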
  4. Network tomography aims at estimating source-destination traffic rates from link traffic measurements. This inverse problem was formulated by Vardi in 1996 for independent Poisson traffic over networks operating under deterministic as well as random routing regimes. Vardi used a second-order moment matching approach to estimate the rates, where a solution for the resulting linear matrix equation was obtained using an iterative minimum I-divergence procedure. Vardi’s second-order moment matching approach was recently extended to a higher-order cumulant matching approach with the goal of improving the rank of the system of linear equations. In this paper, we go one step further and develop a moment generating function matching approach for rate estimation, and seek a least squares as well as an iterative minimum I-divergence solution of the resulting linear equations. We also specialize this approach to a characteristic function matching approach, which exhibits some advantages. These follow from the fact that the characteristic function matching approach results in fewer conflicting equations involving the empirical estimates. We demonstrate that the new approach outperforms the cumulant matching approach while being conceptually simpler.
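    A sketch of the characteristic-function matching idea on an assumed toy topology (the probe frequencies, sample size, and ordinary least squares solver are illustrative choices): for Y = A X with independent Poisson rates r, log E[exp(i w·Y)] = sum_j r_j (exp(i a_j·w) - 1), so empirical log-characteristic-function values at several small probe vectors w yield a linear system in r.

        import numpy as np

        rng = np.random.default_rng(2)

        A = np.array([[1, 0, 1, 0],
                      [0, 1, 1, 0],
                      [1, 1, 0, 1]], dtype=float)   # links x OD pairs (assumed)
        true_rates = np.array([2.0, 1.0, 3.0, 0.5])

        X = rng.poisson(true_rates, size=(100_000, A.shape[1]))
        Y = X @ A.T                                  # simulated link counts

        # Small probe frequencies keep the empirical CF away from zero and avoid
        # branch ambiguity in the complex logarithm.
        probes = 0.1 * rng.standard_normal((20, A.shape[0]))

        rows, rhs = [], []
        for w in probes:
            phi_hat = np.exp(1j * (Y @ w)).mean()    # empirical CF at probe w
            rows.append(np.exp(1j * (A.T @ w)) - 1.0)
            rhs.append(np.log(phi_hat))

        # Stack real and imaginary parts and solve ordinary least squares.
        M = np.vstack([np.real(rows), np.imag(rows)])
        b = np.concatenate([np.real(rhs), np.imag(rhs)])
        est, *_ = np.linalg.lstsq(M, b, rcond=None)
        print("true:", true_rates, "estimated:", np.round(est, 2))
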
  5. The provisioning of delay guarantees in packet-switched networks such as the Internet remains an important, yet challenging open problem. We propose and evaluate a framework, based on results from stochastic network calculus, for guaranteeing stochastic bounds on network delay at a statistical multiplexer. The framework consists of phase-type traffic bounds and moment generating function traffic envelopes, stochastic traffic regulators to enforce the traffic bounds, and an admission control scheme to ensure that a stochastic delay bound is maintained for a given set of flows. Through numerical examples, we show that a stochastic delay bound is maintained at the multiplexer, and contrast the proposed framework to an approach based on deterministic network calculus.
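    A simplified admission-control check in the spirit of such a framework (the envelope parameters, service rate, delay target, and the particular Chernoff-style bound below are assumptions for illustration; the phase-type bounds and regulators of the framework are not reproduced): each flow carries an MGF envelope E[exp(theta*A(t))] <= exp(theta*(rho*t + sigma)), and a candidate set of flows is admitted if some theta > 0 certifies P(delay > d) <~ exp(theta*(sum sigma_i - C*d)) <= epsilon while sum rho_i < C.

        import math

        def admit(flows, C, d, eps, thetas=(0.01, 0.05, 0.1, 0.5, 1.0)):
            """flows: list of (sigma, rho) envelope parameters, here taken to be
            constant in theta for simplicity. Admit if some theta certifies the bound."""
            sigma_tot = sum(sigma for sigma, _ in flows)
            rho_tot = sum(rho for _, rho in flows)
            for theta in thetas:
                if rho_tot < C and math.exp(theta * (sigma_tot - C * d)) <= eps:
                    return True
            return False

        existing = [(3000.0, 1.0e5)] * 8           # (sigma bytes, rho bytes/s) per flow
        candidate = (5000.0, 1.5e5)
        C, d, eps = 1.5e6, 0.02, 1e-6              # server rate, delay target, violation prob.
        print("admit candidate:", admit(existing + [candidate], C, d, eps))
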
  6. This paper assesses the feasibility of a novel dynamic spectrum sharing approach for a cellular downlink based on cognitive overlay to allow non-orthogonal cellular transmissions from a primary and a secondary radio access technology concurrently on the same radio resources. The 2-user Gaussian cognitive interference channel is used to model a downlink scenario in which the primary and secondary base stations are co-located. A system architecture is defined that addresses practical challenges associated with cognitive overlay, in particular the noncausal knowledge of the primary user message at the cognitive transmitter. A cognitive overlay scheme is applied that combines superposition coding with dirty paper coding, and a primary user protection criterion is derived that is specific to a scenario in which the primary system is 4G while the secondary system is 5G. Simulation is used to evaluate the achievable signal-to-interference-plus-noise ratio (SINR) at the 4G and 5G receivers, as well as the cognitive power allocation parameter as a function of distance. Results suggest that the cognitive overlay scheme is feasible when the distance to the 5G receiver is relatively small, even when a large majority of the secondary user transmit power is allocated to protecting the primary user transmission. Achievable link distances for the 5G receiver are on the order of hundreds of meters for an urban macrocell or a few kilometers for a rural macrocell.
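    A rough numerical sketch of the overlay trade-off under an assumed log-distance path-loss model (the transmit powers, noise level, path-loss parameters, and receiver placements are illustrative, not the paper's simulation settings): the co-located cognitive transmitter spends a fraction alpha of its power reinforcing the 4G signal so that the 4G SINR is not degraded, dirty paper coding is assumed to remove the known 4G interference at the 5G receiver, and the remaining (1 - alpha) of the power determines the achievable 5G SINR as a function of distance.

        import numpy as np

        def path_gain(d_m, exponent=3.5, ref_gain_db=-30.0):
            """Simple log-distance path gain at distance d_m (assumed model)."""
            return 10 ** (ref_gain_db / 10) * d_m ** (-exponent)

        def min_alpha_for_protection(P_p, P_s, g_4g, noise):
            """Smallest alpha so the 4G SINR with the overlay matches the 4G-only SINR."""
            target = P_p * g_4g / noise
            for alpha in np.linspace(0.0, 1.0, 10001):
                signal = (np.sqrt(P_p * g_4g) + np.sqrt(alpha * P_s * g_4g)) ** 2
                if signal / ((1 - alpha) * P_s * g_4g + noise) >= target:
                    return alpha
            return 1.0

        P_p, P_s, noise = 40.0, 10.0, 1e-13            # watts; assumed example values
        g_4g = path_gain(500.0)                        # 4G receiver fixed at 500 m (assumed)
        alpha = min_alpha_for_protection(P_p, P_s, g_4g, noise)
        for d_5g in (100.0, 300.0, 1000.0):            # 5G receiver distances in meters
            sinr_5g = (1 - alpha) * P_s * path_gain(d_5g) / noise
            print(f"d_5g={d_5g:6.0f} m  alpha={alpha:.3f}  5G SINR={10*np.log10(sinr_5g):6.1f} dB")

    With these assumed numbers the protection criterion consumes roughly 97% of the secondary power, and the 5G SINR falls from about 25 dB at 100 m to below 0 dB at 1 km, consistent with the qualitative conclusion above.
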
  7. In a wireless network with dynamic spectrum sharing, tracking temporal spectrum holes across a wide spectrum band is a challenging task. We consider a scenario in which the spectrum is divided into a large number of bands or channels, each of which has the potential to provide dynamic spectrum access opportunities. The occupancy times of each band by primary users are generally non-exponentially distributed. We develop an approach to determine and parameterize a small selected subset of the bands with good spectrum access opportunities, using limited computational resources under noisy measurements. We model the noisy measurements of the received signal in each band as a bivariate Markov modulated Gaussian process, which can be viewed as a continuous-time bivariate Markov chain observed through Gaussian noise. The underlying bivariate Markov process allows for the characterization of non-exponentially distributed state sojourn times. The proposed scheme combines an online expectation-maximization algorithm for parameter estimation with a computing budget allocation algorithm. Observation time is allocated across the bands to determine the subset of frequency bands with the largest mean idle times for dynamic spectrum access and, at the same time, to obtain accurate parameter estimates for this subset of bands. Our simulation results show that when channel holding times are non-exponential, the proposed scheme achieves a substantial improvement in the probability of correct selection of the best subset of bands compared to an approach based on a (univariate) Markov modulated Gaussian process model.
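    A heavily simplified stand-in for the band-selection problem (the gamma holding times, Gaussian observation model, energy threshold, and band parameters are assumptions; the bivariate Markov model, online EM, and computing budget allocation described above are not reproduced here): each band alternates between idle and busy with non-exponential holding times, noisy per-slot observations are thresholded, and bands are ranked by the resulting estimate of mean idle time.

        import numpy as np

        rng = np.random.default_rng(3)

        def simulate_band(mean_idle, mean_busy, n_slots=20_000, dt=0.01):
            """Per-slot Gaussian observations whose mean depends on the hidden
            busy/idle state; gamma(shape=2) holding times are non-exponential."""
            samples, t, busy = [], 0.0, False
            hold = rng.gamma(2.0, mean_idle / 2.0)
            for _ in range(n_slots):
                if t >= hold:
                    busy, t = not busy, 0.0
                    hold = rng.gamma(2.0, (mean_busy if busy else mean_idle) / 2.0)
                samples.append((2.0 if busy else 0.0) + rng.normal(0.0, 0.3))
                t += dt
            return np.array(samples)

        def mean_idle_estimate(power, dt=0.01, threshold=1.0):
            """Average length (in seconds) of runs of below-threshold samples."""
            idle = np.r_[0, (power < threshold).astype(np.int8), 0]
            edges = np.flatnonzero(np.diff(idle))
            runs = edges[1::2] - edges[::2]            # lengths of idle runs
            return runs.mean() * dt if runs.size else 0.0

        # Assumed (mean idle, mean busy) durations in seconds for four bands.
        bands = {"b0": (0.5, 0.5), "b1": (2.0, 0.5), "b2": (1.0, 1.0), "b3": (3.0, 1.0)}
        est = {name: mean_idle_estimate(simulate_band(mi, mb)) for name, (mi, mb) in bands.items()}
        print({k: round(v, 2) for k, v in est.items()})
        print("selected:", sorted(est, key=est.get, reverse=True)[:2])
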
  8. We extend network tomography to traffic flows that are not necessarily Poisson random processes. The Poisson assumption has governed the field since its inception by Y. Vardi in 1996. We allow the distribution of the packet count of each traffic flow in a given time interval to be a mixture of Poisson random variables. Both discrete as well as continuous mixtures are studied. For the latter case, we focus on mixed Poisson distributions with a Gamma mixing distribution. As is well known, this mixed Poisson distribution is the negative binomial distribution. Other mixing distributions, such as the Wald or inverse Gaussian distribution, can be used. Mixture distributions are overdispersed with variance larger than the mean. Thus, they are more suitable for Internet traffic than the Poisson model. We develop a second-order moment matching approach for estimating the mean traffic rate for each source-destination pair using least squares and the minimum I-divergence iterative procedure. We demonstrate the performance of the proposed approach by several numerical examples. The results show that the averaged normalized mean squared error in rate estimation is of the same order as in the classic Poisson-based network tomography. Furthermore, no degradation in performance was observed when traffic rates are Poisson but Poisson mixtures are assumed.
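    A quick numerical check of the Gamma-mixed Poisson statement (the shape and scale values are arbitrary): Poisson counts whose rate is drawn from a Gamma(shape k, scale theta) distribution follow a negative binomial law with mean k*theta and variance k*theta*(1 + theta), so the sample variance should visibly exceed the sample mean.

        import numpy as np

        rng = np.random.default_rng(4)
        shape, scale, n = 3.0, 2.0, 200_000        # assumed Gamma mixing parameters

        rates = rng.gamma(shape, scale, size=n)    # random per-sample Poisson rates
        counts = rng.poisson(rates)                # Gamma-mixed Poisson draws

        print("empirical mean/var:", counts.mean(), counts.var())
        print("negative binomial mean/var:", shape * scale, shape * scale * (1 + scale))
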
  9. Providing end-to-end network delay guarantees in packet-switched networks such as the Internet is highly desirable for mission-critical and delay-sensitive data transmission, yet it remains a challenging open problem. Due to the looseness of the deterministic bounds, various frameworks for stochastic network calculus have been proposed to provide tighter, probabilistic bounds on network delay, at least in theory. However, little attention has been devoted to the problem of regulating traffic according to stochastic burstiness bounds, which is necessary in order to guarantee the delay bounds in practice. We propose and analyze a stochastic traffic regulator that can be used in conjunction with results from stochastic network calculus to provide probabilistic guarantees on end-to-end network delay. Numerical results are provided to demonstrate the performance of the proposed traffic regulator. 
  10. Network traffic is difficult to characterize due to its random, bursty nature. Even if a traffic source could be fit to a stochastic model with reasonable accuracy, analysis of end-to-end network performance metrics for such traffic models is generally intractable. In prior work, an approach to characterize traffic burstiness using a bound based on the class of phase-type distributions was proposed. Such phase-type bounds could be applied in conjunction with stochastic network calculus to derive probabilistic end-to-end delay bounds for a traffic stream. In this paper, we focus on the problem of estimating a tight phase-type burstiness bound for a given traffic trace. We investigate a method based on least squares and another based on the expectation-maximization algorithm. Our numerical results compare the two approaches in the scenario of a heavy-tailed M/G/1 queue. We find that while both methods are viable approaches for deriving phase-type bounds on traffic burstiness, the least squares approach performs better, particularly when a tail limit is imposed. 
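    A least-squares flavored sketch of fitting a hyperexponential tail (a simple phase-type form) to a workload trace; the Pareto-service generator, the fixed rate grid, the relative-error weighting, and the final upper-bound scaling are assumptions for illustration and do not reproduce the estimators compared in the paper.

        import numpy as np
        from scipy.optimize import nnls

        rng = np.random.default_rng(5)

        def workload_samples(n, util=0.7, pareto_shape=2.5):
            """Waiting-time-like samples from a Lindley recursion with mean-1,
            Pareto-tailed service times and exponential interarrivals (assumed)."""
            inter = rng.exponential(1.0 / util, size=n)
            service = (pareto_shape - 1) / pareto_shape * (rng.pareto(pareto_shape, size=n) + 1.0)
            w, out = 0.0, np.empty(n)
            for i in range(n):
                w = max(w + service[i] - inter[i], 0.0)
                out[i] = w
            return out

        samples = workload_samples(100_000)
        x_grid = np.linspace(0.0, np.quantile(samples, 0.999), 200)
        tail = np.array([(samples > x).mean() for x in x_grid])    # empirical survival

        rates = np.array([0.1, 0.2, 0.5, 1.0, 2.0, 5.0])           # assumed rate grid
        basis = np.exp(-np.outer(x_grid, rates))                   # exp(-rate * x)

        # Nonnegative least squares on the relative error so the tail region matters.
        weights, _ = nnls(basis / tail[:, None], np.ones_like(tail))
        fit = basis @ weights
        scale = float(np.max(tail / np.maximum(fit, 1e-300)))      # lift fit to an upper bound
        print("weights:", np.round(weights, 3), "bound scale:", round(scale, 2))

    The scaled mixture sum_k (scale * weight_k) * exp(-rate_k * x) then upper-bounds the empirical survival function on the grid, which is the kind of phase-type burstiness bound that can be fed into stochastic network calculus.
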